An Analysis of the Local Optima Storage Capacity of Hopfield Network Based Fitness Function Models
Abstract
A Hopfield Neural Network (HNN) with a new weight update rule can be treated as a second-order Estimation of Distribution Algorithm (EDA) or Fitness Function Model (FFM) for solving optimisation problems. The HNN models promising solutions and has a capacity for storing a certain number of local optima as low-energy attractors. Solutions are generated by sampling the patterns stored in the attractors. The number of attractors a network can store (its capacity) has an impact on solution diversity and, consequently, on solution quality. This paper introduces two new HNN learning rules and presents the Hopfield EDA (HEDA), which learns weight values from samples of the fitness function. It investigates the attractor storage capacity of the HEDA and shows it to be equal to that known in the literature for a standard HNN. The relationship between HEDA capacity and linkage order is also investigated.
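To make the attractor-storage idea concrete, the following is a minimal sketch of a standard Hopfield network with the classical Hebbian learning rule and asynchronous recall dynamics. It is an illustration of how patterns become low-energy attractors, not an implementation of the paper's HEDA weight update rule; the pattern values and network size are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_weights(patterns):
    """Classical Hebbian rule: W = (1/N) * sum_p x_p x_p^T, with zero diagonal."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def energy(w, state):
    """Hopfield energy E = -1/2 s^T W s; stored patterns sit in low-energy wells."""
    return -0.5 * state @ w @ state

def recall(w, state, sweeps=10):
    """Asynchronous updates until the state settles into an attractor."""
    state = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(state)):
            state[i] = 1 if w[i] @ state >= 0 else -1
    return state

# Store two orthogonal bipolar patterns as attractors.
patterns = np.array([[ 1,  1,  1,  1, -1, -1, -1, -1],
                     [ 1, -1,  1, -1,  1, -1,  1, -1]])
w = hebbian_weights(patterns)

# Probe with a corrupted copy of the first pattern (one bit flipped);
# the dynamics fall back into the nearest attractor.
probe = patterns[0].copy()
probe[0] *= -1
print(np.array_equal(recall(w, probe), patterns[0]))  # True
```

In the EDA/FFM reading described in the abstract, each such attractor plays the role of a stored local optimum, and new candidate solutions are generated by sampling states from these low-energy basins rather than by recalling corrupted memories.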
Similar resources
On the Capacity of Hopfield Neural Networks as EDAs for Solving Combinatorial Optimisation Problems
Multi-modal optimisation problems are characterised by the presence of either local sub-optimal points or a number of equally optimal points. These local optima can be considered as point attractors for hill climbing search algorithms. It is desirable to be able to model them either to avoid mistaking a local optimum for a global one or to allow the discovery of multiple equally optimal solutio...
Explorations of fitness landscapes of a Hopfield associative memory with random and evolutionary walks
We apply evolutionary computations to the Hopfield's neural network model of associative memory. In the model, some of the appropriate configurations of the synaptic weights give the network a function of associative memory. One of our goals is to obtain the distribution of these optimal configurations as the global optima in the synaptic weight space as well as the information of local optima c...
A Differential Evolution and Spatial Distribution based Local Search for Training Fuzzy Wavelet Neural Network
Many parameter-tuning algorithms have been proposed for training Fuzzy Wavelet Neural Networks (FWNNs). Absence of an appropriate structure, convergence to local optima, and low speed are deficiencies of FWNN learning algorithms in previous studies. In this paper, a Memetic Algorithm (MA) is introduced to train FWNNs and address these deficiencies. Differential Evolution...
Computing the capacity of the Hopfield neural network and a practical method for increasing memory capacity
The capacity of the Hopfield model has been considered an important parameter in using this model. In this paper, the Hopfield neural network is modeled as a Shannon channel and an upper bound to its capacity is found. For achieving maximum memory, we focus on the training algorithm of the network, and prove that the capacity of the network is bounded by the maximum number of the ortho...
Phase Transitions of an Oscillator Neural Network with a Standard Hebb Learning Rule
Studies have been made on the phase transition phenomena of an oscillator network model based on a standard Hebb learning rule, like the Hopfield model. The relative phase information (the in-phase and anti-phase) can be embedded in the network. By self-consistent signal-to-noise analysis (SCSNA), it was found that the storage capacity is given by αc = 0.042, which is better than that of Cook's ...
Journal: Trans. Computational Collective Intelligence
Volume: 17, Issue: -
Pages: -
Published: 2014